
    3D Scanning System for Automatic High-Resolution Plant Phenotyping

    Thin leaves, fine stems, self-occlusion, and non-rigid, slowly changing structures make plants difficult subjects for three-dimensional (3D) scanning and reconstruction -- two critical steps in automated visual phenotyping. Many current solutions such as laser scanning, structured light, and multiview stereo can struggle to acquire usable 3D models because of limitations in scanning resolution and calibration accuracy. In response, we have developed a fast, low-cost 3D scanning platform that images plants on a rotating stage with two tilting DSLR cameras centred on the plant. It uses new methods of camera calibration and background removal to achieve high-accuracy 3D reconstruction. We assessed the system's accuracy using a 3D visual hull reconstruction algorithm applied to 2 plastic models of dicotyledonous plants, 2 sorghum plants and 2 wheat plants across different sets of tilt angles. Scan times ranged from 3 minutes (72 images using 2 tilt angles) to 30 minutes (360 images using 10 tilt angles). The leaf lengths, widths, areas and perimeters of the plastic models were measured manually and compared to measurements from the scanning system: results were within 3-4% of each other. The 3D reconstructions obtained with the scanning system show excellent geometric agreement with all six plant specimens, even those with thin leaves and fine stems. Comment: 8 pages, DICTA 201
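    The accuracy assessment above rests on a visual hull reconstructed from calibrated silhouettes. As a rough illustration of that idea only (not the authors' implementation), the sketch below carves a voxel grid against a set of silhouette masks; the function name, the 3x4 projection matrices and the binary masks are all assumed inputs.

```python
# Minimal voxel-carving sketch of a visual hull: keep the voxels whose
# projection lands inside every silhouette. Illustrative only.
import numpy as np

def visual_hull(projections, silhouettes, bounds, resolution=64):
    """projections: list of 3x4 camera matrices; silhouettes: binary masks."""
    (xmin, xmax), (ymin, ymax), (zmin, zmax) = bounds
    xs = np.linspace(xmin, xmax, resolution)
    ys = np.linspace(ymin, ymax, resolution)
    zs = np.linspace(zmin, zmax, resolution)
    X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")
    pts = np.stack([X.ravel(), Y.ravel(), Z.ravel(), np.ones(X.size)])  # 4 x N
    inside = np.ones(X.size, dtype=bool)
    for P, mask in zip(projections, silhouettes):
        uvw = P @ pts                                       # project voxels into this view
        z = np.where(np.abs(uvw[2]) > 1e-9, uvw[2], 1e-9)   # guard the divide
        u = np.round(uvw[0] / z).astype(int)
        v = np.round(uvw[1] / z).astype(int)
        h, w = mask.shape
        ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        hit = np.zeros(X.size, dtype=bool)
        hit[ok] = mask[v[ok], u[ok]] > 0                    # inside this silhouette?
        inside &= hit                                       # must be inside all of them
    return inside.reshape(X.shape)                          # boolean occupancy grid
```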

    Three Dimensional (3D) Reconstruction of Subterranean Clover

    Three-dimensional (3D) plant reconstructions, extended to four dimensions with the use of time series and accompanied by visual modelling, are being used for a number of purposes including the estimation of biovolume and as the basis for functional-structural plant modelling (FSPM). This has been successfully applied to crop species such as cotton (Paproki et al. 2012). Measuring the growth pattern and arrangement of a pasture sward is a difficult task, but such measurements can be used as an indirect measure of other variables of interest, such as growth rate, light interception, nutritional quality, herbivore intake, etc. (Laca and Lemaire 2000). Digital representation of individual plants in three dimensions is one way to determine sward structure. The High Resolution Plant Phenomics Centre (HRPPC) has developed PlantScanℱ, which combines robotics, image analysis and computing advances to accelerate and automate the measurement of plant growth characteristics and allow discrimination of differences between individual plants within species. Image silhouettes and LiDAR (Light Detection And Ranging) are used and combined to digitise plant architecture in three dimensions with a high level of detail. Colour information, extracted from multispectral sensors, and thermal imaging from infra-red (IR) cameras are then overlaid on these 3D plant representations, thus providing a tool to link plant structure to plant function. Successful reconstructions using data collected by PlantScanℱ in controlled conditions have been conducted for a range of grasses such as wheat (Triticum aestivum), rice (Oryza sativa) and corn (Zea mays), and broadleaf species such as canola (Brassica napus), cotton (Gossypium hirsutum) and tobacco (Nicotiana tabacum). This suggests that modelling the sward structure of grass and legume pasture species should be equally achievable. This study explores the use of PlantScanℱ to reconstruct 3D images of the important and common pasture legume, subterranean clover (Trifolium subterraneum), with a view to analysing its 3D structure in silico.
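    The overlay step mentioned above (draping colour or thermal pixel values over a 3D representation) can be illustrated with a simple pinhole-camera projection. The sketch below samples a co-registered 2D image onto a point cloud; the function name, the intrinsics K and the pose (R, t) are hypothetical placeholders, and this is not PlantScanℱ's actual pipeline.

```python
# Sketch: sample per-point values (colour or temperature) from a co-registered
# 2D image onto a 3D point cloud via a pinhole camera model. Illustrative only;
# the calibration inputs K, R and t are assumed to come from elsewhere.
import numpy as np

def sample_image_onto_points(points_xyz, image, K, R, t):
    """Return one image value per 3D point (NaN where the point is not visible)."""
    cam = R @ points_xyz.T + t.reshape(3, 1)      # world -> camera coordinates
    proj = K @ cam                                # pinhole projection
    in_front = proj[2] > 1e-6                     # ignore points behind the camera
    z = np.where(in_front, proj[2], 1.0)          # guard the perspective divide
    u = np.round(proj[0] / z).astype(int)
    v = np.round(proj[1] / z).astype(int)
    h, w = image.shape[:2]
    ok = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    values = np.full((points_xyz.shape[0],) + image.shape[2:], np.nan)
    values[ok] = image[v[ok], u[ok]]
    return values
```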

    Flexible scientific data management for plant phenomics research

    In this paper, we expand on the design and implementation of the Phenomics Ontology Driven Data repository [1] (PODD) with respect to the capture, storage and retrieval of data and metadata generated at the High Resolution Plant Phenomics Centre (Canberra, Australia). PODD is a schema-driven Semantic Web database which uses the Resource Description Framework (RDF) model to store semi-structured information. RDF allows PODD to process information about a range of phenomics experiments without needing to define a universal schema for all of the different structures. To illustrate the process, exemplar datasets were generated using a medium throughput, high resolution, three-dimensional digitisation system purposely built for studying plant structure and function simultaneously under specific environmental conditions. The High Performance Compute (HPC), storage and data collection publication aspects of the workflow and their realisation in CSIRO infrastructure are also discussed along with their relationship to PODD.
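    As a small illustration of the schema-free storage described above, the sketch below records a phenotyping observation as RDF triples with rdflib and queries it with SPARQL. The namespace and property names are invented for the example and are not the actual PODD ontology.

```python
# A phenotyping record expressed as RDF triples and queried with SPARQL.
# The http://example.org namespace and the property names are illustrative,
# NOT the PODD ontology.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/phenomics#")
g = Graph()
scan = URIRef("http://example.org/phenomics/scan/0001")
g.add((scan, RDF.type, EX.PlantScanRecord))
g.add((scan, EX.species, Literal("Gossypium hirsutum")))
g.add((scan, EX.capturedOn, Literal("2012-03-14", datatype=XSD.date)))
g.add((scan, EX.leafAreaCm2, Literal(152.7, datatype=XSD.double)))

# New attributes can be attached later without migrating a fixed schema.
query = """
SELECT ?scan ?area
WHERE { ?scan <http://example.org/phenomics#leafAreaCm2> ?area }
"""
for row in g.query(query):
    print(row.scan, row.area)
```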

    Phenomenal: a software framework for model-assisted analysis of high throughput plant phenotyping data

    Plant high-throughput phenotyping aims at capturing the genetic variability of plant response to environmental factors for thousands of plants, hence identifying heritable traits for genomic selection and predicting the genetic values of allelic combinations in different environments. This first implies the automation of the measurement of a large number of traits to characterize plant growth, plant development and plant functioning. It also requires a fluent and versatile interaction between data and continuously evolving plant response models, which are essential in the analysis of the marker × environment interaction and in the integration of processes for predicting crop performance [1]. In the frame of the Phenome high-throughput phenotyping infrastructure, we develop Phenomenal: a software framework dedicated to the analysis of high-throughput phenotyping data and models. It is based on the OpenAlea platform [2], which provides methods and software for the modelling of plants, together with a user-friendly interface for the design and execution of scientific workflows. OpenAlea is also part of the InfraPhenoGrid infrastructure that allows high-throughput computation and recording of provenance during execution [3]. Figure 1: The 3D plant reconstruction and segmentation pipeline. Multi-view plant images from PhenoArch are binarised and used to reconstruct plants in 3D. The 3D skeleton is extracted and separated into stem (central vertical elements) and leaves. 3D voxels are segmented by propagating the skeleton segmentation.
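    The final step of the pipeline sketched in Figure 1 (propagating the skeleton's stem/leaf labelling onto the voxels) can be approximated by a nearest-neighbour assignment, as in the illustrative stand-in below; this is not the Phenomenal implementation itself.

```python
# Propagate a stem/leaf labelling from skeleton points to reconstructed voxels
# by nearest-neighbour assignment. Illustrative stand-in with toy data.
import numpy as np
from scipy.spatial import cKDTree

def propagate_skeleton_labels(voxels_xyz, skeleton_xyz, skeleton_labels):
    """Give every voxel the label of its nearest skeleton point."""
    tree = cKDTree(skeleton_xyz)
    _, nearest = tree.query(voxels_xyz, k=1)
    return skeleton_labels[nearest]

# Toy example: label 0 = stem, label 1 = a leaf.
voxels = np.random.rand(1000, 3)
skeleton = np.array([[0.5, 0.5, 0.2], [0.5, 0.5, 0.8], [0.2, 0.9, 0.6]])
labels = np.array([0, 0, 1])
voxel_labels = propagate_skeleton_labels(voxels, skeleton, labels)
```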

    Growth of the C4 dicot Flaveria bidentis: photosynthetic acclimation to low light through shifts in leaf anatomy and biochemistry

    In C4 plants, acclimation to growth at low irradiance by means of anatomical and biochemical changes to leaf tissue is considered to be limited by the need for a close interaction and coordination between bundle sheath and mesophyll cells. Here, differences in relative growth rate (RGR), gas exchange, carbon isotope discrimination, photosynthetic enzyme activity, and leaf anatomy were examined in the C4 dicot Flaveria bidentis grown at low (LI; 150 ”mol quanta m−2 s−1) and medium (MI; 500 ”mol quanta m−2 s−1) irradiance with a 12 h photoperiod over 36 d. RGRs measured using a 3D non-destructive imaging technique were consistently higher in MI plants. Rates of CO2 assimilation per leaf area measured at 1500 ”mol quanta m−2 s−1 were higher for MI than LI plants but did not differ on a mass basis. LI plants had lower Rubisco and phosphoenolpyruvate carboxylase activities and chlorophyll content on a leaf area basis. Bundle sheath leakiness of CO2 (ϕ) calculated from real-time carbon isotope discrimination was similar for MI and LI plants at high irradiance. ϕ increased at lower irradiances, but more so in MI plants, reflecting acclimation to low growth irradiance. Leaf thickness and vein density were greater in MI plants, and mesophyll surface area exposed to intercellular airspace (Sm) and bundle sheath surface area per unit leaf area (Sb) measured from leaf cross-sections were also both significantly greater in MI compared with LI leaves. Both mesophyll and bundle sheath conductance to CO2 diffusion were greater in MI compared with LI plants. Despite being a C4 species, F. bidentis is very plastic with respect to growth irradiance.
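    The relative growth rates reported above follow the classical definition RGR = (ln W2 − ln W1) / (t2 − t1), applied here to a non-destructively estimated size measure. A minimal sketch with invented numbers:

```python
# Classical relative growth rate, RGR = (ln W2 - ln W1) / (t2 - t1), applied
# to any positive, non-destructive size proxy (e.g. estimated biovolume).
# The example numbers are invented for illustration.
import numpy as np

def relative_growth_rate(size_t1, size_t2, days_elapsed):
    """RGR in day^-1."""
    return (np.log(size_t2) - np.log(size_t1)) / days_elapsed

print(relative_growth_rate(size_t1=12.0, size_t2=30.0, days_elapsed=7))  # ~0.131 day^-1
```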

    Leaf rolling in wheat


    3D plant modelling via hyperspectral imaging

    Plant phenomics research requires that different types of sensors be employed to measure the physical traits of the plant surface and to estimate plant biomass. Of particular interest is the hyperspectral imaging device, which captures wavelength-indexed band images that characterise the material properties of objects under study. In this paper, we introduce a proof-of-concept study that builds 3D plant models directly from hyperspectral images captured in a controlled lab environment. We show that hyperspectral imaging offers clear advantages in segmenting the plant from its background and is promising for generating comprehensive 3D plant models.
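    One common way to exploit such hyperspectral data for plant/background segmentation is to threshold a vegetation index such as NDVI computed from near-infrared and red bands; the sketch below assumes that approach for illustration and is not necessarily the method used in the paper.

```python
# Threshold an NDVI image, NDVI = (NIR - red) / (NIR + red), computed from the
# hyperspectral cube's nearest bands to 670 nm and 800 nm. Illustrative only.
import numpy as np

def ndvi_plant_mask(cube, wavelengths_nm, red_nm=670.0, nir_nm=800.0, threshold=0.3):
    """cube: (H, W, bands) reflectance; wavelengths_nm: (bands,) band centres."""
    red = cube[..., np.argmin(np.abs(wavelengths_nm - red_nm))].astype(float)
    nir = cube[..., np.argmin(np.abs(wavelengths_nm - nir_nm))].astype(float)
    ndvi = (nir - red) / (nir + red + 1e-9)   # small epsilon avoids divide-by-zero
    return ndvi > threshold                    # boolean mask: True where plant
```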

    A novel mesh processing based technique for 3D plant analysis

    Background: In recent years, imaging based, automated, non-invasive, and non-destructive high-throughput plant phenotyping platforms have become popular tools for plant biology, underpinning the field of plant phenomics. Such platforms acquire and record large amounts of raw data that must be accurately and robustly calibrated, reconstructed, and analysed, requiring the development of sophisticated image understanding and quantification algorithms. The raw data can be processed in different ways, and the past few years have seen the emergence of two main approaches: 2D image processing and 3D mesh processing algorithms. Direct image quantification methods (usually 2D) dominate the current literature due to their comparative simplicity. However, 3D mesh analysis provides tremendous potential to accurately estimate specific morphological features cross-sectionally and monitor them over time. Result: In this paper, we present a novel 3D mesh based technique developed for temporal high-throughput plant phenomics and perform initial tests for the analysis of Gossypium hirsutum vegetative growth. Based on plant meshes previously reconstructed from multi-view images, the methodology involves several stages, including morphological mesh segmentation, phenotypic parameter estimation, and plant organ tracking over time. The initial study focuses on presenting and validating the accuracy of the methodology on dicotyledons such as cotton, but we believe the approach will be more broadly applicable. This study involved applying our technique to a set of six Gossypium hirsutum (cotton) plants studied over four time-points. Manual measurements, performed for each plant at every time-point, were used to assess the accuracy of our pipeline and quantify the error on the morphological parameters estimated. Conclusion: By directly comparing our automated mesh based quantitative data with manual measurements of individual stem height, leaf width and leaf length, we obtained mean absolute errors of 9.34%, 5.75%, and 8.78%, and correlation coefficients of 0.88, 0.96, and 0.95, respectively. The temporal matching of leaves was accurate in 95% of the cases, and the average execution time required to analyse a plant over four time-points was 4.9 minutes. The mesh processing based methodology is thus considered suitable for quantitative 4D monitoring of plant phenotypic features.
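    The validation figures quoted above (mean absolute percentage errors and correlation coefficients against manual measurements) amount to the following arithmetic; the measurement arrays in the sketch are placeholders, not the study's data.

```python
# Mean absolute percentage error and Pearson correlation between automated
# (mesh-derived) and manual measurements. The arrays are placeholder values.
import numpy as np

def mean_absolute_percent_error(automated, manual):
    automated, manual = np.asarray(automated, float), np.asarray(manual, float)
    return float(np.mean(np.abs(automated - manual) / manual) * 100.0)

def pearson_r(automated, manual):
    return float(np.corrcoef(automated, manual)[0, 1])

auto_height = [31.2, 40.5, 28.9, 35.0]     # cm, hypothetical mesh estimates
manual_height = [30.0, 42.0, 27.5, 36.1]   # cm, hypothetical manual measurements
print(mean_absolute_percent_error(auto_height, manual_height))
print(pearson_r(auto_height, manual_height))
```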